Using Kernel PCA for Initialisation of Nonlinear Factor Analysis

Authors

  • Antti Honkela
  • Stefan Harmeling
  • Leo Lundqvist
  • Harri Valpola
Abstract

The nonlinear factor analysis (NFA) method by Lappalainen and Honkela (2000) [2] is initialised with linear principal component analysis (PCA). Because of the multilayer perceptron (MLP) network used to model the nonlinearity, the method is susceptible to local minima and therefore sensitive to the initialisation used. As the method is used for nonlinear separation, the linear initialisation may in some cases lead it astray. In this report we study using kernel PCA (KPCA) to initialise NFA. KPCA is a rather straightforward generalisation of linear PCA and it is much faster to compute than NFA. The experiments show that it may produce significantly better initialisations than linear PCA, although finding a suitable kernel and parameters may be difficult.
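As a rough illustration of the idea (not the authors' implementation), the sketch below uses scikit-learn to compute both a linear PCA and a kernel PCA projection of a data matrix; either projection could serve as the initial values of the NFA source signals. The data, the number of sources and the RBF kernel width are made-up placeholders.

```python
# Minimal sketch: kernel PCA as an alternative to linear PCA for initialising
# the NFA sources. All sizes and parameter values below are illustrative.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))        # observations: 500 samples, 10 dims
n_sources = 3                             # assumed number of latent sources

# Linear PCA initialisation (the original NFA default).
s_init_linear = PCA(n_components=n_sources).fit_transform(X)

# Kernel PCA initialisation: linear PCA generalised by replacing dot products
# with a kernel, here an RBF kernel with a hand-picked width.
kpca = KernelPCA(n_components=n_sources, kernel="rbf", gamma=0.1)
s_init_kernel = kpca.fit_transform(X)

# Either matrix can be used as the starting point for the source signals;
# the MLP weights of the NFA model are then fitted from that initialisation.
print(s_init_linear.shape, s_init_kernel.shape)   # (500, 3) (500, 3)
```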


Similar articles

Using Kernel PCA for Initialisation of Variational Bayesian Nonlinear Blind Source Separation Method

The variational Bayesian nonlinear blind source separation method introduced by Lappalainen and Honkela in 2000 is initialised with linear principal component analysis (PCA). Because of the multilayer perceptron (MLP) network used to model the nonlinearity, the method is susceptible to local minima and therefore sensitive to the initialisation used. As the method is used for nonlinear separatio...


Gabor feature-based apple quality inspection using kernel principal component analysis

Automated inspection of apple quality involves computer recognition of good apples and blemished apples based on geometric or statistical features derived from apple images. This paper introduces a Gabor feature-based kernel principal component analysis (PCA) method by combining Gabor wavelet representation of apple images and the kernel PCA method for apple quality inspection using near-infrar...


Kernel PCA for Feature Extraction and De-Noising in Nonlinear Regression

In this paper, we propose the application of the Kernel Principal Component Analysis (PCA) technique for feature selection in a high-dimensional feature space, where input variables are mapped by a Gaussian kernel. The extracted features are employed in the regression problems of chaotic Mackey–Glass time-series prediction in a noisy environment and estimating huma...
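The excerpt above describes feeding features extracted by kernel PCA with a Gaussian (RBF) kernel into a regression problem. A minimal sketch of that pipeline, on synthetic data rather than the Mackey–Glass series and with placeholder parameter values, might look as follows.

```python
# Hypothetical illustration: Gaussian-kernel KPCA features used by a regressor.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(400, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(400)   # noisy nonlinear target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Extract nonlinear features with an RBF (Gaussian) kernel.
kpca = KernelPCA(n_components=10, kernel="rbf", gamma=0.5)
Z_tr = kpca.fit_transform(X_tr)
Z_te = kpca.transform(X_te)

# Fit an ordinary linear regressor on the extracted features.
model = Ridge(alpha=1.0).fit(Z_tr, y_tr)
print("test R^2:", model.score(Z_te, y_te))
```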


Hyperparameter Selection in Kernel Principal Component Analysis

In kernel methods, choosing a suitable kernel is indispensable for favorable results. No well-founded methods, however, have been established in general for unsupervised learning. We focus on kernel Principal Component Analysis (kernel PCA), which is a nonlinear extension of principal component analysis and has been used effectively for extracting nonlinear features and reducing dimensionality. ...


Hyperspectral Image Compression and Target Detection Using Nonlinear Principal Component Analysis

The widely used principal component analysis (PCA) is implemented nonlinearly by an auto-associative neural network. Compared to other nonlinear versions, such as kernel PCA, such a nonlinear PCA has explicit encoding and decoding processes, and the data can be transformed back to the original space. Its data compression performance is similar to that of PCA, but data analysis performance such...
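The auto-associative formulation mentioned above trains a network to reproduce its input through a narrow bottleneck, so that encoding and decoding are explicit. A toy single-hidden-layer sketch is shown below; real nonlinear-PCA networks use a deeper encoder/decoder, and all sizes here are illustrative.

```python
# Toy auto-associative ("autoencoder") sketch of nonlinear PCA: reconstruct the
# input through a 2-unit bottleneck, giving an explicit encoding and decoding.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.standard_normal((300, 8))

# Train the network to map X back to X through the bottleneck layer.
ae = MLPRegressor(hidden_layer_sizes=(2,), activation="tanh",
                  max_iter=2000, random_state=0)
ae.fit(X, X)

# Encoding: activations of the bottleneck layer; decoding: the network output.
codes = np.tanh(X @ ae.coefs_[0] + ae.intercepts_[0])
X_rec = ae.predict(X)
print(codes.shape, X_rec.shape)   # (300, 2) (300, 8)
```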



Publication year: 2003